Multimodal Emotion Recognition From EEG Signals and Facial Expressions

Authors

Abstract

Emotion recognition has attracted a lot of attention in recent years and is widely used in health care, teaching, human-computer interaction, and other fields. Various human emotional features are often used to recognize different emotions, and there is currently a growing body of research on multimodal emotion recognition based on the fusion of multiple features. This paper proposes a deep learning model that combines electroencephalogram (EEG) signals and facial expressions to achieve an excellent classification effect. First, a pre-trained convolutional neural network (CNN) is used to extract features from facial expressions. Next, an attention mechanism is introduced to select critical frames. Then, CNNs are applied to extract spatial features from the original EEG signals, using a local convolution kernel and a global convolution kernel to learn features from the left- and right-hemisphere channels and from all channels. After feature-level fusion, the fused facial-expression and EEG features are fed into a classifier for emotion recognition. Experiments were conducted on the DEAP and MAHNOB-HCI datasets to evaluate the performance of the proposed model. The accuracy reaches 96.63% on the valence dimension and 97.15% on arousal for the DEAP dataset, and 96.69% and 96.26% respectively for the MAHNOB-HCI dataset. The experimental results show that the proposed model can effectively carry out multimodal emotion recognition.
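The feature-level fusion step described in the abstract can be illustrated with a minimal sketch. All feature dimensions, weights, and class counts below are hypothetical placeholders, not values from the paper; in the actual model the features come from the trained CNN branches and the classifier weights are learned.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-sample feature vectors (dimensions are illustrative only):
facial_feat = rng.standard_normal(128)  # e.g. from a pre-trained CNN + frame attention
eeg_feat = rng.standard_normal(64)      # e.g. from local/global convolution kernels

# Feature-level fusion: concatenate the two modality vectors into one.
fused = np.concatenate([facial_feat, eeg_feat])  # shape (192,)

# A toy linear classifier over the fused vector (random weights here;
# in the model they would be learned end to end).
W = rng.standard_normal((2, fused.size))  # 2 classes, e.g. low/high valence
logits = W @ fused
probs = np.exp(logits - logits.max())
probs /= probs.sum()  # softmax over the two classes

print(fused.shape)  # (192,)
```

The key design point is that fusion happens before classification (feature-level) rather than by averaging per-modality predictions (decision-level), so the classifier can learn cross-modal interactions.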


Similar Articles

Fusion of Facial Expressions and EEG for Multimodal Emotion Recognition

This paper proposes two multimodal fusion methods between brain and peripheral signals for emotion recognition. The input signals are electroencephalogram and facial expression. The stimuli are based on a subset of movie clips that correspond to four specific areas of valence-arousal emotional space (happiness, neutral, sadness, and fear). For facial expression detection, four basic emotion sta...


Neuropsychosocial Factors in Emotion Recognition: Facial Expressions

It has often been said that the eyes are the “window to the soul.” This statement may be carried to a logical assumption that not only the eyes but the entire face may reflect the “hidden” emotions of the individual. With this assumption in mind, this paper will describe the various theories subsumed within the rubric of emotion recognition from the fields of social psychology and human neurops...


Emotion recognition from expressions in face, voice, and body: the Multimodal Emotion Recognition Test (MERT).

Emotion recognition ability has been identified as a central component of emotional competence. We describe the development of an instrument that objectively measures this ability on the basis of actor portrayals of dynamic expressions of 10 emotions (2 variants each for 5 emotion families), operationalized as recognition accuracy in 4 presentation modes combining the visual and auditory sense ...


Emotion Recognition from Facial Expressions using Multilevel HMM

Human-computer intelligent interaction (HCII) is an emerging field of science aimed at providing natural ways for humans to use computers as aids. It is argued that for the computer to be able to interact with humans, it needs to have the communication skills of humans. One of these skills is the ability to understand the emotional state of the person. The most expressive way humans display emo...


Cognitive penetrability and emotion recognition in human facial expressions

Do our background beliefs, desires, and mental images influence our perceptual experience of the emotions of others? In this paper, we will address the possibility of cognitive penetration (CP) of perceptual experience in the domain of social cognition. In particular, we focus on emotion recognition based on the visual experience of facial expressions. After introducing the current debate on CP...



Journal

Journal title: IEEE Access

Year: 2023

ISSN: 2169-3536

DOI: https://doi.org/10.1109/access.2023.3263670